Discovery of Huffman codes

Authors

  • Inna Pivkina
  • Gary Stix
  • David A. Huffman
Abstract

Large networks of IBM computers use it. So do high-definition television, modems and a popular electronic device that takes the brain work out of programming a videocassette recorder. All these digital wonders rely on the results of a 40-year-old term paper by a modest Massachusetts Institute of Technology graduate student—a data compression scheme known as Huffman encoding. In 1951 David A. Huffman and his classmates in an electrical engineering graduate course on information theory were given the choice of a term paper or a final exam. For the term paper, Huffman’s professor, Robert M. Fano, had assigned what at first appeared to be a simple problem. Students were asked to find the most efficient method of representing numbers, letters or other symbols using a binary code. Besides being a nimble intellectual exercise, finding such a code would enable information to be compressed for transmission over a computer network or for storage in a computer’s memory. Huffman worked on the problem for months, developing a number of approaches, but none that he could prove to be the most efficient. Finally, he despaired of ever reaching a solution and decided to start studying for the final. Just as he was throwing his notes in the garbage, the solution came to him. “It was the most singular moment of my life,” Huffman says. “There was the absolute lightning of sudden realization.” That epiphany added Huffman to the legion of largely anonymous engineers whose innovative thinking forms the technical underpinnings for the accoutrements of modern living—in his case, from facsimile machines to modems and a myriad of other devices. “Huffman code is one of the fundamental ideas that people in computer science and data communications are using all the time,” says Donald E. Knuth of Stanford University, who is the author of the multivolume series The Art of Computer Programming. Huffman says he might never have tried his hand at the problem—much less solved it at the age of 25—if he had known that Fano, his professor, and Claude E. Shannon, the creator of information theory, had struggled with it. “It was my luck to be there at the right time and also not have my professor discourage me by telling me that other good people had struggled with this problem,” he says.
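For readers who want to see the idea the abstract describes in concrete form, the following is a minimal sketch of Huffman's construction: repeatedly merge the two least-frequent symbols (or subtrees) so that rare symbols receive long codewords and common symbols receive short ones. It is an illustrative sketch only; the function name and the example frequencies are assumptions, not taken from the article.

import heapq
from typing import Dict

def huffman_codes(freqs: Dict[str, int]) -> Dict[str, str]:
    """Build a prefix code in which frequent symbols get short codewords."""
    if not freqs:
        return {}
    if len(freqs) == 1:
        # Single-symbol edge case: assign a one-bit code.
        return {sym: "0" for sym in freqs}
    # Each heap entry: (weight, tie_breaker, {symbol: partial codeword}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, codes1 = heapq.heappop(heap)   # two least-frequent subtrees
        w2, _, codes2 = heapq.heappop(heap)
        # Prepend one bit to every codeword in each merged subtree.
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

if __name__ == "__main__":
    # Example frequencies (assumed for illustration).
    print(huffman_codes({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))

Running the example assigns the most frequent symbol "a" a one-bit codeword and the rarer symbols three- or four-bit codewords; the exact bits may differ between runs of the algorithm, but the codeword lengths are what make the code optimal.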


Related resources

Decoding prefix codes

Minimum-redundancy prefix codes have been a mainstay of research and commercial compression systems since their discovery by David Huffman more than 50 years ago. In this experimental evaluation we compare techniques for decoding minimum-redundancy codes, and quantify the relative benefits of recently developed restricted codes that are designed to accelerate the decoding process. We find that ...
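As a point of reference for what decoding means here, the sketch below walks a bit stream against a prefix-code table one bit at a time; the table-driven techniques such papers compare are faster refinements of this same idea. The code table and bit string are illustrative assumptions, not drawn from the paper.

from typing import Dict

def decode(bits: str, code: Dict[str, str]) -> str:
    """Decode a bit string using a prefix code given as {symbol: codeword}."""
    reverse = {cw: sym for sym, cw in code.items()}  # codeword -> symbol
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in reverse:           # prefix property: first match is the symbol
            out.append(reverse[buf])
            buf = ""
    if buf:
        raise ValueError("trailing bits do not form a complete codeword")
    return "".join(out)

if __name__ == "__main__":
    code = {"a": "0", "b": "10", "c": "11"}   # illustrative prefix code
    print(decode("011010", code))             # -> "acba"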


Non-binary Huffman code PDF

A Method for the Construction of Minimum-Redundancy Codes (PDF). Huffman codes. Corollary 28: consider a coding from a length-n vector of source symbols, x = x1 x2 ... xn, to a binary codeword of length l(x). Then the... Correctness of the Huffman coding algorithm. A binary code encodes each character as a binary string; the goal is a code that encodes the file using as few bits as...


A Two-phase Practical Parallel Algorithm for Construction of Huffman Codes

The construction of optimal prefix codes plays a significant and influential role in applications concerning information processing and communication. For decades, different algorithms have been proposed for Huffman code construction, and various optimizations have been introduced. In this paper we propose a detailed practical time-efficient parallel algorithm for generating Huffman cod...


Introduction to Data Compression

3 Probability Coding; 3.1 Prefix Codes; 3.1.1 Relationship to Entropy; 3.2 Huffman Codes; 3.2.1 Combining Messages; 3.2.2 Minim...


A Comparative Complexity Study of Fixed-to-variable Length and Variable-to-fixed Length Source Codes

In this paper we present an analysis of the storage complexity of Huffman codes, Tunstall codes and arithmetic codes in various implementations and relate this to the achieved redundancies. It turns out that there exist efficient implementations of both Huffman and Tunstall codes and that their approximations result in arithmetic codes. Although not optimal, the arithmetic codes still have a be...


A new bound for the data expansion of Huffman codes

It is proven that for every random variable with a countably infinite set of outcomes and finite entropy there exists an optimal prefix code which can be constructed from Huffman codes for truncated versions of the random variable, and that the average lengths of any sequence of Huffman codes for the truncated versions converge to that of the optimal code. Also, it is shown that every optimal i...



Journal title:

Volume   Issue

Pages  -

Publication date: 2008